Given a bank customer, build a neural network-based classifier that can determine whether the customer will leave the bank within the next 6 months.
CustomerId: unique ID assigned to each customer
Surname: last name of the customer
CreditScore: the customer's credit history score
Geography: the customer's location
Gender: the customer's gender
Age: age of the customer
Tenure: number of years the customer has been with the bank
NumOfProducts: number of products the customer has purchased through the bank
Balance: account balance
HasCrCard: categorical variable indicating whether the customer has a credit card
EstimatedSalary: estimated salary
IsActiveMember: categorical variable indicating whether the customer is an active member of the bank (i.e., regularly uses bank products, makes transactions, etc.)
Exited: categorical variable indicating whether the customer left the bank within six months; it can take two values
import warnings
warnings.filterwarnings("ignore")
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import pandas_profiling
sns.set(color_codes=True)
%matplotlib inline
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn import metrics
from sklearn.metrics import f1_score,accuracy_score, recall_score, precision_score, roc_auc_score, roc_curve, confusion_matrix, precision_recall_curve
from sklearn.utils import class_weight
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.optimizers import Adam
import tensorflow as tf
# read data from csv file
data = pd.read_csv('bank.csv')
# get columns
data.columns
Index(['RowNumber', 'CustomerId', 'Surname', 'CreditScore', 'Geography',
'Gender', 'Age', 'Tenure', 'Balance', 'NumOfProducts', 'HasCrCard',
'IsActiveMember', 'EstimatedSalary', 'Exited'],
dtype='object')
# get size of dataset
data.shape
(10000, 14)
# check dataset information
data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 10000 entries, 0 to 9999
Data columns (total 14 columns):
 #   Column           Non-Null Count  Dtype
---  ------           --------------  -----
 0   RowNumber        10000 non-null  int64
 1   CustomerId       10000 non-null  int64
 2   Surname          10000 non-null  object
 3   CreditScore      10000 non-null  int64
 4   Geography        10000 non-null  object
 5   Gender           10000 non-null  object
 6   Age              10000 non-null  int64
 7   Tenure           10000 non-null  int64
 8   Balance          10000 non-null  float64
 9   NumOfProducts    10000 non-null  int64
 10  HasCrCard        10000 non-null  int64
 11  IsActiveMember   10000 non-null  int64
 12  EstimatedSalary  10000 non-null  float64
 13  Exited           10000 non-null  int64
dtypes: float64(2), int64(9), object(3)
memory usage: 1.1+ MB
# check dataset missing values
total = data.isnull().sum().sort_values(ascending=False) # total number of null values
print(total)
RowNumber          0
CustomerId         0
Surname            0
CreditScore        0
Geography          0
Gender             0
Age                0
Tenure             0
Balance            0
NumOfProducts      0
HasCrCard          0
IsActiveMember     0
EstimatedSalary    0
Exited             0
dtype: int64
# check for duplicates
data.duplicated().sum()
0
This first assessment of the dataset shows 10,000 rows and 14 columns, with no missing values and no duplicate rows.
# check first rows of data
data.head()
| | RowNumber | CustomerId | Surname | CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 15634602 | Hargrave | 619 | France | Female | 42 | 2 | 0.00 | 1 | 1 | 1 | 101348.88 | 1 |
| 1 | 2 | 15647311 | Hill | 608 | Spain | Female | 41 | 1 | 83807.86 | 1 | 0 | 1 | 112542.58 | 0 |
| 2 | 3 | 15619304 | Onio | 502 | France | Female | 42 | 8 | 159660.80 | 3 | 1 | 0 | 113931.57 | 1 |
| 3 | 4 | 15701354 | Boni | 699 | France | Female | 39 | 1 | 0.00 | 2 | 0 | 0 | 93826.63 | 0 |
| 4 | 5 | 15737888 | Mitchell | 850 | Spain | Female | 43 | 2 | 125510.82 | 1 | 1 | 1 | 79084.10 | 0 |
# check last rows of data
data.tail()
| | RowNumber | CustomerId | Surname | CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 9995 | 9996 | 15606229 | Obijiaku | 771 | France | Male | 39 | 5 | 0.00 | 2 | 1 | 0 | 96270.64 | 0 |
| 9996 | 9997 | 15569892 | Johnstone | 516 | France | Male | 35 | 10 | 57369.61 | 1 | 1 | 1 | 101699.77 | 0 |
| 9997 | 9998 | 15584532 | Liu | 709 | France | Female | 36 | 7 | 0.00 | 1 | 0 | 1 | 42085.58 | 1 |
| 9998 | 9999 | 15682355 | Sabbatini | 772 | Germany | Male | 42 | 3 | 75075.31 | 2 | 1 | 0 | 92888.52 | 1 |
| 9999 | 10000 | 15628319 | Walker | 792 | France | Female | 28 | 4 | 130142.79 | 1 | 1 | 0 | 38190.78 | 0 |
We can get a first statistical and descriptive analysis using pandas_profiling.
# get pandas profiling report
pandas_profiling.ProfileReport(data)
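If the inline report is too heavy for the notebook, it can instead be written to a standalone HTML file; a minimal sketch (the output filename is our choice):
# Save the profiling report to a standalone HTML file instead of rendering it inline
profile = pandas_profiling.ProfileReport(data)
profile.to_file('bank_churn_profile.html')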
The pandas profiling report surfaces some warnings and notable characteristics in the data, which we can complement with descriptive statistics:
# get stats for the columns
data.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| RowNumber | 10000.0 | 5.000500e+03 | 2886.895680 | 1.00 | 2500.75 | 5.000500e+03 | 7.500250e+03 | 10000.00 |
| CustomerId | 10000.0 | 1.569094e+07 | 71936.186123 | 15565701.00 | 15628528.25 | 1.569074e+07 | 1.575323e+07 | 15815690.00 |
| CreditScore | 10000.0 | 6.505288e+02 | 96.653299 | 350.00 | 584.00 | 6.520000e+02 | 7.180000e+02 | 850.00 |
| Age | 10000.0 | 3.892180e+01 | 10.487806 | 18.00 | 32.00 | 3.700000e+01 | 4.400000e+01 | 92.00 |
| Tenure | 10000.0 | 5.012800e+00 | 2.892174 | 0.00 | 3.00 | 5.000000e+00 | 7.000000e+00 | 10.00 |
| Balance | 10000.0 | 7.648589e+04 | 62397.405202 | 0.00 | 0.00 | 9.719854e+04 | 1.276442e+05 | 250898.09 |
| NumOfProducts | 10000.0 | 1.530200e+00 | 0.581654 | 1.00 | 1.00 | 1.000000e+00 | 2.000000e+00 | 4.00 |
| HasCrCard | 10000.0 | 7.055000e-01 | 0.455840 | 0.00 | 0.00 | 1.000000e+00 | 1.000000e+00 | 1.00 |
| IsActiveMember | 10000.0 | 5.151000e-01 | 0.499797 | 0.00 | 0.00 | 1.000000e+00 | 1.000000e+00 | 1.00 |
| EstimatedSalary | 10000.0 | 1.000902e+05 | 57510.492818 | 11.58 | 51002.11 | 1.001939e+05 | 1.493882e+05 | 199992.48 |
| Exited | 10000.0 | 2.037000e-01 | 0.402769 | 0.00 | 0.00 | 0.000000e+00 | 0.000000e+00 | 1.00 |
We are going to perform bivariate analysis to understand the relationships between the columns.
# Continuous columns + Exited
con_col = ['CreditScore', 'Age', 'Tenure', 'Balance', 'EstimatedSalary', 'Exited']
# Pairplot for continuous columns
sns.pairplot(data[con_col], diag_kind='kde', hue='Exited');
# Get correlation matrix for numeric variables
data[con_col].corr()
| | CreditScore | Age | Tenure | Balance | EstimatedSalary | Exited |
|---|---|---|---|---|---|---|
| CreditScore | 1.000000 | -0.003965 | 0.000842 | 0.006268 | -0.001384 | -0.027094 |
| Age | -0.003965 | 1.000000 | -0.009997 | 0.028308 | -0.007201 | 0.285323 |
| Tenure | 0.000842 | -0.009997 | 1.000000 | -0.012254 | 0.007784 | -0.014001 |
| Balance | 0.006268 | 0.028308 | -0.012254 | 1.000000 | 0.012797 | 0.118533 |
| EstimatedSalary | -0.001384 | -0.007201 | 0.007784 | 0.012797 | 1.000000 | 0.012097 |
| Exited | -0.027094 | 0.285323 | -0.014001 | 0.118533 | 0.012097 | 1.000000 |
# Display correlation matrix in a heatmap
fig, ax = plt.subplots(figsize=(10,7))
sns.heatmap(data[con_col].corr(), annot=True, ax=ax);
# Create subplots for the different boxplot charts
fig, axes = plt.subplots(1, 5, figsize=(20, 5))
i = 0
for feature in con_col:
    if feature != 'Exited':
        # display a boxplot of the feature split by churn status
        sns.boxplot(x=data['Exited'], y=data[feature], ax=axes[i])
        i += 1
plt.tight_layout()
plt.show()
Observations: in the correlation matrix, Age has the strongest linear relationship with Exited (0.29), followed by Balance (0.12); CreditScore, Tenure, and EstimatedSalary show almost no linear correlation with churn.
# Function to plot stacked bar charts for categorical columns
def stacked_plot(x, hue):
    # cross-tabulate x against hue as row-normalized percentages
    tab = 100 * pd.crosstab(x, hue, normalize='index').sort_values(by=hue[0])
    print(tab.T)
    tab.plot(kind='bar', stacked=True)
# Plot the stacked plot
stacked_plot(data['Geography'], data['Exited'])
Geography     France      Spain    Germany
Exited
0          83.845233  83.326605  67.556796
1          16.154767  16.673395  32.443204
# Plot the stacked plot
stacked_plot(data['Gender'], data['Exited'])
Gender        Male     Female
Exited
0        83.544072  74.928461
1        16.455928  25.071539
# Plot the stacked plot
stacked_plot(data['NumOfProducts'], data['Exited'])
NumOfProducts          2          1          3      4
Exited
0              92.418301  72.285602  17.293233    0.0
1               7.581699  27.714398  82.706767  100.0
# Plot the stacked plot
stacked_plot(data['HasCrCard'], data['Exited'])
HasCrCard          1          0
Exited
0          79.815734  79.185059
1          20.184266  20.814941
# Plot the stacked plot
stacked_plot(data['IsActiveMember'], data['Exited'])
IsActiveMember          1          0
Exited
0               85.730926  73.149103
1               14.269074  26.850897
# Drop RowNumber column
data.drop(['RowNumber'], axis=1, inplace=True)
# Drop CustomerId column
data.drop(['CustomerId'], axis=1, inplace=True)
# Drop Surname column
data.drop(['Surname'], axis=1, inplace=True)
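Equivalently, the three identifier columns could have been dropped in a single call; a sketch for reference, not meant to be run in addition to the cells above:
# Drop all three identifier columns at once (equivalent to the three cells above)
data = data.drop(columns=['RowNumber', 'CustomerId', 'Surname'])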
# create independent variables
X = data.drop(['Exited'], axis=1)
# create dependent variable
y = data['Exited']
# Splitting data into training, validation and test sets:
# first we split the data into two parts: a temporary set and the test set
X_temp, X_test, y_temp, y_test = train_test_split(X, y, test_size=0.2, random_state=1, stratify=y)
# then we split the temporary set into train and validation
# (0.25 of the remaining 80% yields a 60/20/20 train/validation/test split)
X_train, X_val, y_train, y_val = train_test_split(X_temp, y_temp, test_size=0.25, random_state=1, stratify=y_temp)
print(f'Shape of Training set: {X_train.shape}')
print(f'Shape of Validation set: {X_val.shape}')
print(f'Shape of Test set: {X_test.shape}')
print(f'Percentage of classes in Training set\n{y_train.value_counts(normalize=True)}')
print(f'Percentage of classes in Validation set\n{y_val.value_counts(normalize=True)}')
print(f'Percentage of classes in Test set\n{y_test.value_counts(normalize=True)}')
Shape of Training set: (6000, 10)
Shape of Validation set: (2000, 10)
Shape of Test set: (2000, 10)
Percentage of classes in Training set
0    0.796333
1    0.203667
Name: Exited, dtype: float64
Percentage of classes in Validation set
0    0.796
1    0.204
Name: Exited, dtype: float64
Percentage of classes in Test set
0    0.7965
1    0.2035
Name: Exited, dtype: float64
# one-hot encoding for categorical variables
X_train = pd.get_dummies(X_train, drop_first=True)
X_val = pd.get_dummies(X_val, drop_first=True)
X_test = pd.get_dummies(X_test, drop_first=True)
print(X_train.shape, X_val.shape, X_test.shape)
(6000, 11) (2000, 11) (2000, 11)
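Applying pd.get_dummies separately to each split works here because every split happens to contain all categories; in general, though, the validation and test frames should be aligned to the training columns. A defensive sketch, assuming X_train has already been encoded:
# Align validation/test dummy columns to the training columns; a category
# missing from a split becomes an all-zero column instead of shifting the layout
X_val = X_val.reindex(columns=X_train.columns, fill_value=0)
X_test = X_test.reindex(columns=X_train.columns, fill_value=0)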
# Scaling the data using z-score
scaler = StandardScaler()
columns = X_train.columns
# Fit and transform the train data
X_train[columns] = scaler.fit_transform(X_train[columns])
# Transform the validation data
X_val[columns] = scaler.transform(X_val[columns])
# Transform the test data
X_test[columns] = scaler.transform(X_test[columns])
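Note that the scaler is fitted on the training set only and merely applied to the validation and test sets, which prevents information from the held-out data leaking into training.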
True Positives: customers the model predicts will leave the bank and who actually leave.
True Negatives: customers the model predicts will stay and who actually stay.
False Positives: customers the model predicts will leave but who actually stay.
False Negatives: customers the model predicts will stay but who actually leave.
Recall should be used as the measure of model performance: recall = TP / (TP + FN), so a high recall score implies few False Negatives, i.e., few churners that the model misses.
# Function to calculate different metric scores of the classification model
def metrics_score(model, train, test, train_y, test_y, threshold=0.5, model_name=''):
    '''
    Inputs:
    model: classifier to predict values of X
    train, test: independent features in the train and validation sets
    train_y, test_y: dependent variable in the train and validation sets
    threshold: threshold for classifying an observation as 1
    model_name: name of the model
    '''
    # Make the predictions of the model on the train and validation sets
    pred_train = (model.predict(train) > threshold)
    pred_test = (model.predict(test) > threshold)
    # Calculate scores and save them in a dictionary (y_true first, y_pred second)
    score_dict = {'Model': model_name,
                  'Accuracy on training set': accuracy_score(train_y, pred_train),
                  'Accuracy on validation set': accuracy_score(test_y, pred_test),
                  'Recall on training set': recall_score(train_y, pred_train),
                  'Recall on validation set': recall_score(test_y, pred_test),
                  'Precision on training set': precision_score(train_y, pred_train),
                  'Precision on validation set': precision_score(test_y, pred_test),
                  'F1 on training set': f1_score(train_y, pred_train),
                  'F1 on validation set': f1_score(test_y, pred_test)
                  }
    # Create a dataframe with the scores
    scores = pd.DataFrame(score_dict, index=[0])
    # return the df
    return scores
# Function to display the confusion matrix
def make_confusion_matrix(model, test_X, y_actual, threshold=0.5, labels=[1, 0]):
    '''
    Inputs:
    model: classifier to predict values of X
    test_X: test set
    y_actual: ground truth
    threshold: threshold for classifying an observation as 1
    labels: class labels, positive class first
    '''
    # Make predictions with the model
    y_predict = (model.predict(test_X) > threshold).astype('float')
    # calculate the confusion matrix
    cm = metrics.confusion_matrix(y_actual, y_predict, labels=labels)
    # create a dataframe to display the heatmap
    df_cm = pd.DataFrame(cm,
                         index=['Actual - Attrited Customer', 'Actual - Existing Customer'],
                         columns=['Predicted - Attrited Customer', 'Predicted - Existing Customer'])
    # calculate the count and percentage for each confusion-matrix cell
    group_counts = ["{0:0.0f}".format(value) for value in cm.flatten()]
    group_percentages = ["{0:.1%}".format(value) for value in cm.flatten() / np.sum(cm)]
    # create cell labels
    group_labels = ['(TP)', '(FN)', '(FP)', '(TN)']
    cell_labels = [f"{v1}\n{v2}\n{v3}" for v1, v2, v3 in zip(group_counts, group_percentages, group_labels)]
    cell_labels = np.asarray(cell_labels).reshape(2, 2)
    # display the confusion matrix in a heatmap
    plt.figure(figsize=(8, 6))
    sns.heatmap(df_cm, annot=cell_labels, fmt='')
    plt.ylabel('True label')
    plt.xlabel('Predicted label')
# Function to plot training results by epoch
def plot_hist(history):
    # Capture the learning history per epoch
    hist = pd.DataFrame(history.history)
    hist['epoch'] = history.epoch
    # Plot loss at different epochs
    plt.figure(figsize=(20, 5))
    plt.subplot(121)
    plt.title('Loss')
    plt.plot(hist['loss'])
    plt.plot(hist['val_loss'])
    plt.legend(('train', 'valid'), loc=0)
    # Plot recall at different epochs
    plt.subplot(122)
    plt.title('Recall')
    plt.plot(hist['recall'])
    plt.plot(hist['val_recall'])
    plt.legend(('train', 'valid'), loc=0)
# Function to calculate the optimal threshold as per the AUC-ROC curve
def optimal_threshold(X_val, y_val, model):
    # The optimal cut-off is where tpr is high and fpr is low,
    # i.e., the threshold maximizing Youden's J statistic (tpr - fpr)
    fpr, tpr, thresholds = metrics.roc_curve(y_val, model.predict(X_val))
    optimal_idx = np.argmax(tpr - fpr)
    optimal_threshold_auc_roc = thresholds[optimal_idx]
    print(optimal_threshold_auc_roc)
    return optimal_threshold_auc_roc
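precision_recall_curve was imported above but not used so far; as an alternative to the ROC-based cut-off, a threshold can also be chosen from the precision-recall curve, for example by maximizing F1. A minimal sketch (the helper name optimal_threshold_pr is ours):
# Function to calculate an optimal threshold from the precision-recall curve
def optimal_threshold_pr(X_val, y_val, model):
    # F1 is maximized where precision and recall are jointly high
    prec, rec, thresholds = precision_recall_curve(y_val, model.predict(X_val))
    f1 = 2 * prec * rec / (prec + rec + 1e-12)  # small epsilon avoids division by zero
    best_idx = np.argmax(f1[:-1])               # precision/recall have one more entry than thresholds
    print(thresholds[best_idx])
    return thresholds[best_idx]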
# Initialize seed for random numbers
np.random.seed(1)
tf.random.set_seed(1)
# Initialize the model
model = Sequential()
## Adding layers
# Input Layer and first hidden layer
model.add(Dense(units=10, input_dim = 11, activation='elu', kernel_initializer='random_normal')) # input of 11 columns as shown above
# Hidden layer
model.add(Dense(units=10, activation='elu', kernel_initializer='random_normal'))
# Adding Dropout to prevent overfitting
model.add(Dropout(0.5))
# Output layer
model.add(Dense(1, activation='sigmoid', kernel_initializer='random_normal'))
We are going to compile the model using the Adam optimizer with a learning rate of 0.01, binary cross-entropy loss, and recall as the evaluation metric:
# Compile the model
model.compile(optimizer=Adam(learning_rate=0.01),loss='binary_crossentropy',metrics=['Recall'])
Summary of the model
# Summary
model.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense (Dense) (None, 10) 120 _________________________________________________________________ dense_1 (Dense) (None, 10) 110 _________________________________________________________________ dropout (Dropout) (None, 10) 0 _________________________________________________________________ dense_2 (Dense) (None, 1) 11 ================================================================= Total params: 241 Trainable params: 241 Non-trainable params: 0 _________________________________________________________________
Now, we are going to train the model for 200 epochs
#fitting the model
history=model.fit(X_train,y_train,batch_size=32,epochs=200,validation_split=0.2)
Epoch 1/200
150/150 [==============================] - 1s 3ms/step - loss: 0.4812 - recall: 0.1805 - val_loss: 0.4141 - val_recall: 0.1498
Epoch 2/200
150/150 [==============================] - 0s 1ms/step - loss: 0.4368 - recall: 0.2154 - val_loss: 0.3737 - val_recall: 0.4332
Epoch 3/200
150/150 [==============================] - 0s 1ms/step - loss: 0.3881 - recall: 0.3374 - val_loss: 0.3369 - val_recall: 0.5061
[... epochs 4-198 omitted ...]
Epoch 199/200
150/150 [==============================] - 0s 1ms/step - loss: 0.3374 - recall: 0.4369 - val_loss: 0.3444 - val_recall: 0.4737
Epoch 200/200
150/150 [==============================] - 0s 1ms/step - loss: 0.3437 - recall: 0.4154 - val_loss: 0.3342 - val_recall: 0.4696
# plot loss and recall
plot_hist(history)
Observation: the training loss falls quickly over the first few epochs and then plateaus around 0.34, while validation loss and recall fluctuate without clear improvement, so most of the 200 epochs add little.
Now, we are going to analyze the ROC-AUC on the training and validation sets
Training set
roc_auc_train = roc_auc_score(y_train, model.predict(X_train))
fpr, tpr, thresholds = roc_curve(y_train, model.predict(X_train))
plt.figure(figsize=(7,5))
plt.plot(fpr, tpr, label='Neural Network (area = %0.2f)' % roc_auc_train)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
Validation set
roc_auc_val = roc_auc_score(y_val, model.predict(X_val))
fpr, tpr, thresholds = roc_curve(y_val, model.predict(X_val))
plt.figure(figsize=(7,5))
plt.plot(fpr, tpr, label='Neural Network (area = %0.2f)' % roc_auc_val)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.show()
Determine the optimal threshold from the AUC-ROC curve
# determine optimal threshold
optimal_threshold_auc_roc_1 = optimal_threshold(X_val, y_val, model)
0.21450984
Now, we are going to calculate the different metrics and evaluate the model's performance on the train and validation sets
# checking model performances for this model in train and validation sets
scores_nn = metrics_score(model,X_train,X_val,y_train,y_val,threshold=optimal_threshold_auc_roc_1,model_name='AUC-ROC')
scores_nn
| | Model | Accuracy on training set | Accuracy on validation set | Recall on training set | Recall on validation set | Precision on training set | Precision on validation set | F1 on training set | F1 on validation set |
|---|---|---|---|---|---|---|---|---|---|
| 0 | AUC-ROC | 0.778 | 0.783 | 0.795417 | 0.767157 | 0.473223 | 0.480061 | 0.593407 | 0.590566 |
# creating confusion matrix
make_confusion_matrix(model,X_val,y_val, threshold=optimal_threshold_auc_roc_1)
Observations: with the ROC-based threshold, the model reaches a validation recall of about 0.77, but precision drops to about 0.48, meaning roughly half of the predicted churners are false alarms.
We are going to try different strategies to improve the model performance:
Early Stopping Callback
We will use an early stopping callback in each of the following models.
# Creating the early stopping callback
early_stopping = tf.keras.callbacks.EarlyStopping(monitor='val_loss',
                                                  min_delta=0,
                                                  patience=20,
                                                  verbose=0, mode='min',
                                                  restore_best_weights=True)
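With monitor='val_loss', patience=20 and restore_best_weights=True, training stops once the validation loss has not improved for 20 consecutive epochs, and the weights from the best epoch are restored.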
# Initialize seed for random numbers
np.random.seed(1)
tf.random.set_seed(1)
# Create a copy of the model (with freshly initialized weights)
model_2 = tf.keras.models.clone_model(model)
# change learning rate to 0.0005
model_2.compile(optimizer=Adam(learning_rate=0.0005),loss='binary_crossentropy',metrics=['Recall'])
#fitting the model
history = model_2.fit(X_train,y_train,batch_size=32,epochs=200,validation_split=0.2,callbacks=[early_stopping])
Epoch 1/200
150/150 [==============================] - 1s 2ms/step - loss: 0.6648 - recall: 0.0297 - val_loss: 0.6102 - val_recall: 0.1053
Epoch 2/200
150/150 [==============================] - 0s 1ms/step - loss: 0.5326 - recall: 0.1477 - val_loss: 0.4514 - val_recall: 0.2146
[... epochs 3-90 omitted ...]
Epoch 91/200
150/150 [==============================] - 0s 1ms/step - loss: 0.3495 - recall: 0.4441 - val_loss: 0.3216 - val_recall: 0.5101
Epoch 92/200
150/150 [==============================] - 0s 1ms/step - loss: 0.3509 - recall: 0.4103 - val_loss: 0.3216 - val_recall: 0.4980
# plot loss and recall
plot_hist(history)
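For readers landing on this section: plot_hist is a small helper defined earlier in the notebook. A minimal sketch of what such a helper could look like (the name plot_hist_sketch, figure layout, and labels here are assumptions, not the original definition) is:
# hypothetical sketch of a plot_hist-style helper; the real definition
# appears earlier in the notebook
def plot_hist_sketch(history):
    fig, axes = plt.subplots(1, 2, figsize=(12, 4))
    axes[0].plot(history.history['loss'], label='train loss')
    axes[0].plot(history.history['val_loss'], label='validation loss')
    axes[0].set_xlabel('epoch')
    axes[0].legend()
    axes[1].plot(history.history['recall'], label='train recall')
    axes[1].plot(history.history['val_recall'], label='validation recall')
    axes[1].set_xlabel('epoch')
    axes[1].legend()
    plt.show()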
# determine optimal threshold
optimal_threshold_auc_roc_2 = optimal_threshold(X_val, y_val, model_2)
0.18675604
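The optimal_threshold helper is likewise defined earlier. A common recipe for an ROC-based cutoff like the 0.187 above, and a plausible sketch of what the helper does (the function name and exact criterion are assumptions), is to maximize Youden's J statistic (TPR − FPR) over the validation ROC curve:
# hypothetical sketch of an ROC-based optimal_threshold helper
# (the actual definition appears earlier in the notebook)
def optimal_threshold_sketch(X, y, model):
    probs = model.predict(X).ravel()            # predicted churn probabilities
    fpr, tpr, thresholds = roc_curve(y, probs)  # ROC over all candidate cutoffs
    return thresholds[np.argmax(tpr - fpr)]     # cutoff maximizing Youden's J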
# check model performance on the training and validation sets
scores_nn2 = metrics_score(model_2,X_train,X_val,y_train,y_val,threshold=optimal_threshold_auc_roc_2,model_name='Model 2')
scores_nn2
| | Model | Accuracy on training set | Accuracy on validation set | Recall on training set | Recall on validation set | Precision on training set | Precision on validation set | F1 on training set | F1 on validation set |
|---|---|---|---|---|---|---|---|---|---|
| 0 | Model 2 | 0.781667 | 0.7875 | 0.767594 | 0.77451 | 0.477597 | 0.486903 | 0.588826 | 0.597919 |
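The low threshold trades precision for recall, which is why validation recall (0.77) far exceeds validation precision (0.49). For reference, a metrics_score helper like the one used here plausibly applies the threshold to the predicted probabilities and tabulates the four metrics on both splits; a minimal sketch (the function name, column names, and column order are assumptions about the earlier definition):
# hypothetical sketch of a metrics_score-style helper; the real definition
# appears earlier in the notebook
def metrics_score_sketch(model, X_tr, X_v, y_tr, y_v, threshold, model_name):
    row = {'Model': model_name}
    for split, X, y in [('training', X_tr, y_tr), ('validation', X_v, y_v)]:
        pred = (model.predict(X).ravel() > threshold).astype(int)  # apply cutoff
        row['Accuracy on %s set' % split] = accuracy_score(y, pred)
        row['Recall on %s set' % split] = recall_score(y, pred)
        row['Precision on %s set' % split] = precision_score(y, pred)
        row['F1 on %s set' % split] = f1_score(y, pred)
    return pd.DataFrame([row])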
# Initialize seed for random numbers
np.random.seed(1)
tf.random.set_seed(1)
# initialize the model
model_3 = Sequential()
## Adding layers
# First hidden layer (also defines the input shape)
model_3.add(Dense(units=10, input_dim=11, activation='elu')) # 11 input columns, as shown above
# Hidden layers
model_3.add(Dense(units=10,activation='elu'))
# Adding Dropout to prevent overfitting
model_3.add(Dropout(0.5))
model_3.add(Dense(25,activation='elu'))
model_3.add(Dense(25,activation='elu'))
# Output layer
model_3.add(Dense(1,activation='sigmoid'))
# Compile the model
model_3.compile(optimizer=Adam(learning_rate=0.0005),loss='binary_crossentropy',metrics=['Recall'])
# Summary
model_3.summary()
Model: "sequential_1" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense_3 (Dense) (None, 10) 120 _________________________________________________________________ dense_4 (Dense) (None, 10) 110 _________________________________________________________________ dropout_1 (Dropout) (None, 10) 0 _________________________________________________________________ dense_5 (Dense) (None, 25) 275 _________________________________________________________________ dense_6 (Dense) (None, 25) 650 _________________________________________________________________ dense_7 (Dense) (None, 1) 26 ================================================================= Total params: 1,181 Trainable params: 1,181 Non-trainable params: 0 _________________________________________________________________
# fitting the model
history=model_3.fit(X_train,y_train,batch_size=32,epochs=200,validation_split=0.2,callbacks=[early_stopping])
Epoch 1/200 150/150 [==============================] - 1s 3ms/step - loss: 0.5892 - recall: 0.1979 - val_loss: 0.4860 - val_recall: 0.0040
... (epochs 2-138 omitted: val_loss falls steadily from ~0.49 to ~0.32 while val_recall climbs from near zero to ~0.52) ...
Epoch 139/200 150/150 [==============================] - 0s 1ms/step - loss: 0.3450 - recall: 0.4431 - val_loss: 0.3191 - val_recall: 0.5223
(early stopping ended training at epoch 139 of 200)
# plot loss and recall
plot_hist(history)
# determine optimal threshold
optimal_threshold_auc_roc_3 = optimal_threshold(X_val, y_val, model_3)
0.16913718
# check model performance on the training and validation sets
scores_nn3 = metrics_score(model_3,X_train,X_val,y_train,y_val,threshold=optimal_threshold_auc_roc_3,model_name='Model 3')
scores_nn3
| | Model | Accuracy on training set | Accuracy on validation set | Recall on training set | Recall on validation set | Precision on training set | Precision on validation set | F1 on training set | F1 on validation set |
|---|---|---|---|---|---|---|---|---|---|
| 0 | Model 3 | 0.784 | 0.797 | 0.774959 | 0.767157 | 0.481199 | 0.501603 | 0.59373 | 0.606589 |
# Initialize seed for random numbers
np.random.seed(1)
tf.random.set_seed(1)
# initialize the model
model_4 = Sequential()
## Adding layers
# First hidden layer (also defines the input shape)
model_4.add(Dense(units=10, input_dim=11, kernel_initializer='he_normal', activation='elu')) # 11 input columns, as shown above
# Hidden layer
model_4.add(Dense(units=10, kernel_initializer='he_normal', activation='elu'))
# Adding Dropout to prevent overfitting
model_4.add(Dropout(0.5))
# Output layer
model_4.add(Dense(1, kernel_initializer='he_normal', activation='sigmoid'))
# Compile the model
model_4.compile(optimizer=Adam(learning_rate=0.0005),loss='binary_crossentropy',metrics=['Recall'])
# Summary
model_4.summary()
Model: "sequential_2" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense_8 (Dense) (None, 10) 120 _________________________________________________________________ dense_9 (Dense) (None, 10) 110 _________________________________________________________________ dropout_2 (Dropout) (None, 10) 0 _________________________________________________________________ dense_10 (Dense) (None, 1) 11 ================================================================= Total params: 241 Trainable params: 241 Non-trainable params: 0 _________________________________________________________________
# fitting the model
history=model_4.fit(X_train,y_train,batch_size=32,epochs=200,validation_split=0.2,callbacks=[early_stopping])
Epoch 1/200 150/150 [==============================] - 1s 2ms/step - loss: 0.8987 - recall: 0.5415 - val_loss: 0.6007 - val_recall: 0.4575
... (epochs 2-192 omitted: val_loss falls from ~0.60 to ~0.32 while val_recall recovers from an early dip and settles around ~0.51) ...
Epoch 193/200 150/150 [==============================] - 0s 1ms/step - loss: 0.3488 - recall: 0.4379 - val_loss: 0.3221 - val_recall: 0.5182
(early stopping ended training at epoch 193 of 200)
# plot loss and recall
plot_hist(history)
# determine optimal threshold
optimal_threshold_auc_roc_4 = optimal_threshold(X_val, y_val, model_4)
0.19439223
# check model performance on the training and validation sets
scores_nn4 = metrics_score(model_4,X_train,X_val,y_train,y_val,threshold=optimal_threshold_auc_roc_4,model_name='Model 4')
scores_nn4
| | Model | Accuracy on training set | Accuracy on validation set | Recall on training set | Recall on validation set | Precision on training set | Precision on validation set | F1 on training set | F1 on validation set |
|---|---|---|---|---|---|---|---|---|---|
| 0 | Model 4 | 0.7895 | 0.797 | 0.760229 | 0.757353 | 0.489205 | 0.501623 | 0.595322 | 0.603516 |
# calculate class weights
class_weights = class_weight.compute_class_weight(class_weight='balanced', classes=np.unique(y_train), y=y_train)
class_weights = dict(enumerate(class_weights))
class_weights
{0: 0.6278777731268314, 1: 2.454991816693944}
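The 'balanced' heuristic assigns each class the weight n_samples / (n_classes × class_count), so the minority churn class is weighted roughly four times as heavily as the majority class. A quick check (assuming y_train is a 0/1 series, as elsewhere in this notebook) reproduces the dictionary above:
# verify the balanced formula: weight_c = n_samples / (n_classes * count_c)
counts = np.bincount(y_train)                 # samples per class
dict(enumerate(len(y_train) / (2 * counts)))  # ≈ {0: 0.628, 1: 2.455}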
# Initialize seed for random numbers
np.random.seed(1)
tf.random.set_seed(1)
# Create a copy of the model (with freshly initialized weights)
model_5 = tf.keras.models.clone_model(model_4)
# compile with the same settings as model_4 (Adam, learning rate 0.0005)
model_5.compile(optimizer=Adam(learning_rate=0.0005),loss='binary_crossentropy',metrics=['Recall'])
# fitting the model
history = model_5.fit(X_train,y_train,batch_size=32,epochs=200,validation_split=0.2,class_weight=class_weights,callbacks=[early_stopping])
Epoch 1/200 150/150 [==============================] - 1s 3ms/step - loss: 0.9369 - recall: 0.5938 - val_loss: 0.7215 - val_recall: 0.7045
Epoch 2/200 150/150 [==============================] - 0s 1ms/step - loss: 0.8486 - recall: 0.5754 - val_loss: 0.6782 - val_recall: 0.7490
...
Epoch 145/200 150/150 [==============================] - 0s 1ms/step - loss: 0.4738 - recall: 0.7467 - val_loss: 0.4468 - val_recall: 0.7733
Epoch 146/200 150/150 [==============================] - 0s 1ms/step - loss: 0.4747 - recall: 0.7579 - val_loss: 0.4490 - val_recall: 0.7854
(training log truncated: early stopping ended training at epoch 146 of 200; training loss fell from 0.94 to 0.47 and validation recall rose from 0.70 to roughly 0.79)
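The fit call above uses class_weights and early_stopping, both defined earlier in the notebook. For reference, here is a minimal sketch of how they are typically constructed; the 'balanced' weighting heuristic and the patience value are assumptions, not the notebook's exact settings:
# Hypothetical reconstruction -- the actual definitions appear earlier in the notebook
import numpy as np
from sklearn.utils import class_weight
import tensorflow as tf

# Weight each class inversely to its frequency so the minority (churn) class
# contributes more to the loss ('balanced' heuristic assumed here)
weights = class_weight.compute_class_weight(class_weight='balanced',
                                            classes=np.unique(y_train),
                                            y=y_train)
class_weights = dict(zip(np.unique(y_train), weights))

# Stop training once validation loss stops improving (patience value assumed)
early_stopping = tf.keras.callbacks.EarlyStopping(monitor='val_loss',
                                                  patience=20,
                                                  restore_best_weights=True)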
# plot loss and recall
plot_hist(history)
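plot_hist is a plotting helper defined earlier in the notebook. A minimal sketch of what such a helper could look like, assuming it draws the per-epoch loss and recall curves stored in history.history (the metric keys match the training log above):
# Hypothetical sketch of a plot_hist helper -- the real one is defined earlier
import matplotlib.pyplot as plt

def plot_hist(history):
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
    # training vs. validation loss per epoch
    ax1.plot(history.history['loss'], label='train')
    ax1.plot(history.history['val_loss'], label='validation')
    ax1.set_title('Loss')
    ax1.set_xlabel('Epoch')
    ax1.legend()
    # training vs. validation recall per epoch
    ax2.plot(history.history['recall'], label='train')
    ax2.plot(history.history['val_recall'], label='validation')
    ax2.set_title('Recall')
    ax2.set_xlabel('Epoch')
    ax2.legend()
    plt.show()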
# determine optimal threshold
optimal_threshold_auc_roc_5 = optimal_threshold(X_val, y_val, model_5)
0.48185006
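The optimal_threshold helper is also defined earlier; the variable name optimal_threshold_auc_roc_5 suggests it derives the cut-off from the ROC curve on the validation set. A minimal sketch under that assumption, choosing the threshold that maximizes Youden's J statistic (TPR - FPR):
# Hypothetical sketch -- the actual helper is defined earlier in the notebook
import numpy as np
from sklearn.metrics import roc_curve

def optimal_threshold(X, y, model):
    probs = model.predict(X).ravel()            # predicted churn probabilities
    fpr, tpr, thresholds = roc_curve(y, probs)  # ROC curve on the validation set
    best = np.argmax(tpr - fpr)                 # maximize Youden's J = TPR - FPR
    print(thresholds[best])
    return thresholds[best]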
# check the performance of this model on the train and validation sets
scores_nn5 = metrics_score(model_5,X_train,X_val,y_train,y_val,threshold=optimal_threshold_auc_roc_5,model_name='Model 5')
scores_nn5
| | Model | Accuracy on training set | Accuracy on validation set | Recall on training set | Recall on validation set | Precision on training set | Precision on validation set | F1 on training set | F1 on validation set |
|---|---|---|---|---|---|---|---|---|---|
| 0 | Model 5 | 0.783833 | 0.792 | 0.758592 | 0.762255 | 0.48056 | 0.493651 | 0.588385 | 0.599229 |
Now we compare all of the models on the test set.
# Function to calculate different metric scores of a classification model on the test set
def metrics_score_test(model, test, test_y, threshold=0.5, model_name=''):
    '''
    Inputs:
    model: classifier used to predict labels for the test observations
    test: independent features in the test set
    test_y: dependent variable in the test set
    threshold: threshold for classifying an observation as 1
    model_name: name of the model
    '''
    # Make predictions on the test set
    pred_test = (model.predict(test) > threshold)
    # Calculate scores and save them in a dictionary
    score_dict = {'Model': model_name,
                  'Accuracy on test set': accuracy_score(test_y, pred_test),
                  'Recall on test set': recall_score(test_y, pred_test),
                  'Precision on test set': precision_score(test_y, pred_test),
                  'F1 on test set': f1_score(test_y, pred_test)}
    # Create a dataframe with the scores and return it
    scores = pd.DataFrame(score_dict, index=[0])
    return scores
# Create a dataframe with scores on the test set for all models
all_scores = pd.concat(
    [metrics_score_test(model, X_test, y_test, threshold=optimal_threshold_auc_roc_1, model_name='1st Model - lr = 0.01'),
     metrics_score_test(model_2, X_test, y_test, threshold=optimal_threshold_auc_roc_2, model_name='2nd Model - lr = 0.0005'),
     metrics_score_test(model_3, X_test, y_test, threshold=optimal_threshold_auc_roc_3, model_name='3rd Model - Extra hidden layers - lr = 0.0005'),
     metrics_score_test(model_4, X_test, y_test, threshold=optimal_threshold_auc_roc_4, model_name='4th Model - Weight initialization - lr = 0.0005'),
     metrics_score_test(model_5, X_test, y_test, threshold=optimal_threshold_auc_roc_5, model_name='5th Model - Class Imbalance - lr = 0.0005')],
    axis=0,
    ignore_index=True)
all_scores.sort_values(by=['Recall on test set'], ascending=False)
| | Model | Accuracy on test set | Recall on test set | Precision on test set | F1 on test set |
|---|---|---|---|---|---|
| 2 | 3rd Model - Extra hidden layers - lr = 0.0005 | 0.7695 | 0.746929 | 0.459215 | 0.568756 |
| 1 | 2nd Model - lr = 0.0005 | 0.7675 | 0.744472 | 0.456325 | 0.565826 |
| 4 | 5th Model - Class Imbalance - lr = 0.0005 | 0.7795 | 0.737101 | 0.473186 | 0.576369 |
| 0 | 1st Model - lr = 0.01 | 0.7595 | 0.734644 | 0.444940 | 0.554217 |
| 3 | 4th Model - Weight initialization - lr = 0.0005 | 0.7785 | 0.729730 | 0.471429 | 0.572806 |
Observations

- The 3rd model (extra hidden layers, lr = 0.0005) reaches the highest recall on the test set (about 0.747), closely followed by the 2nd model.
- The 5th model (class weights) trades a little recall (about 0.737) for the best accuracy (0.7795) and precision (0.4732) of the five.
- Since missing a customer who is about to churn is costlier than flagging one who stays, recall is the metric to prioritize, so we inspect the confusion matrix of the 3rd model on the test set.
make_confusion_matrix(model_3,X_test,y_test,threshold=optimal_threshold_auc_roc_3,labels=[1, 0])
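make_confusion_matrix is the last helper from earlier in the notebook used here. A minimal sketch of one possible implementation, assuming it thresholds the predicted probabilities and renders the resulting confusion matrix as an annotated heatmap:
# Hypothetical sketch -- the actual helper is defined earlier in the notebook
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix

def make_confusion_matrix(model, X, y, threshold=0.5, labels=[1, 0]):
    # threshold the predicted probabilities to get hard class labels
    y_pred = (model.predict(X).ravel() > threshold).astype(int)
    cm = confusion_matrix(y, y_pred, labels=labels)
    # render the counts as an annotated heatmap
    sns.heatmap(cm, annot=True, fmt='d', xticklabels=labels, yticklabels=labels)
    plt.xlabel('Predicted label')
    plt.ylabel('True label')
    plt.show()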